Cerebras-GPT 2.7B is a Transformer-based language model intended to support research on large language models; it can serve as a base model for natural language processing and related fields.
Tags: Large Language Model · Transformers · English
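
As a reference point, below is a minimal usage sketch with the Hugging Face Transformers library. The Hub identifier `cerebras/Cerebras-GPT-2.7B`, the prompt, and the generation settings are assumptions for illustration, not part of the original card.

```python
# Minimal sketch: loading and sampling from Cerebras-GPT 2.7B via Transformers.
# Assumes the model is published on the Hub as "cerebras/Cerebras-GPT-2.7B".
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "cerebras/Cerebras-GPT-2.7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Greedy generation of a short continuation from an example prompt.
prompt = "Large language models are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since it is offered as a base model rather than an instruction-tuned one, downstream use would typically involve fine-tuning or prompting it as a plain causal language model, as in the sketch above.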